A Stable High-Order Tuner for General Convex Functions

Authors

Abstract

Iterative gradient-based algorithms have been increasingly applied to the training of a broad variety of machine learning models, including large neural networks. In particular, momentum-based methods with accelerated learning guarantees have received a lot of attention due to their provably fast convergence in certain classes of problems, and multiple such algorithms have been derived. However, the properties of these methods hold only for constant regressors. When time-varying regressors occur, which is commonplace in dynamic systems, many of these methods cannot guarantee stability. Recently, a new High-order Tuner (HT) was developed for linear regression problems and shown to have 1) stability and asymptotic convergence and 2) non-asymptotic accelerated learning guarantees. In this letter, we extend and discuss the results of this same HT for general convex loss functions. Through the exploitation of convexity and smoothness definitions, we establish similar stability and asymptotic convergence guarantees. Finally, we provide numerical simulations supporting the satisfactory behavior of the HT algorithm as well as an accelerated learning property.
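
To make the kind of update rule discussed above more concrete, the sketch below applies a normalized, momentum-style tuner to a simple least-squares loss. It is an illustrative assumption only: the loss, the gains gamma and beta, the normalizer N, and the function name normalized_momentum_tuner are placeholders, and the sketch does not claim to reproduce the exact HT recursion, normalization, or guarantees established in the letter.

import numpy as np

# Hypothetical example loss: least squares with regressor matrix A and targets b.
# The letter treats general smooth convex losses; this specific loss and the
# gains below are assumptions made purely for illustration.

def normalized_momentum_tuner(A, b, gamma=0.1, beta=0.5, iters=500):
    """Normalized momentum-style parameter update on a smooth convex loss (sketch)."""
    n = A.shape[1]
    theta = np.zeros(n)   # parameter estimate
    nu = np.zeros(n)      # auxiliary momentum-like state
    # Normalization tied to the curvature (smoothness constant) of the example
    # loss; with time-varying regressors this would be recomputed at every step.
    N = 1.0 + np.linalg.norm(A, ord=2) ** 2
    for _ in range(iters):
        g = A.T @ (A @ theta - b) / N                # normalized gradient at theta
        nu = nu - gamma * g                          # slow update of the auxiliary state
        theta_bar = theta - gamma * beta * g         # gradient step on the estimate
        theta = theta_bar - beta * (theta_bar - nu)  # blend the estimate toward nu
    return theta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 5))
    b = A @ np.ones(5) + 0.01 * rng.standard_normal(50)
    theta_hat = normalized_momentum_tuner(A, b)
    print("final residual norm:", np.linalg.norm(A @ theta_hat - b))

In the time-varying regressor setting emphasized in the abstract, the normalizer would be recomputed from the current regressor at each step rather than fixed in advance.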

Similar articles

Inequalities for discrete higher order convex functions

[1] E. BOROS AND A. PRÉKOPA, Closed Form Two-Sided Bounds for Probabilities That Exactly r and at Least r out of n Events Occur, Mathematics of Operations Research, 14 (1989), 317–342. [2] D. DAWSON AND A. SANKOFF, An Inequality for Probabilities, Proceedings of the American Mathematical Society, 18 (1967), 504–507. [3] H.P. EDMUNDSON, Bounds on the Expectation of a Convex Function of a Random ...

Jensen’s Inequality for GG-Convex Functions

In this paper, we obtain Jensen’s inequality for GG-convex functions. We also obtain inequalities similar to the Hermite-Hadamard inequality for GG-convex functions. Some examples are given.
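
Since the snippet above is brief, the following display is a hedged restatement of what GG-convexity is usually taken to mean (geometric-geometric, i.e. multiplicative, convexity), together with the corresponding Jensen-type inequality; the paper's exact hypotheses may differ.

% Assumed standard definition of a GG-convex (multiplicatively convex) function
% f : I \subseteq (0,\infty) \to (0,\infty); the paper's hypotheses may differ.
\[
  f\bigl(x^{t} y^{1-t}\bigr) \;\le\; f(x)^{t}\, f(y)^{1-t},
  \qquad x, y \in I,\ t \in [0,1],
\]
% and the corresponding Jensen-type inequality for weights summing to one:
\[
  f\Bigl(\prod_{i=1}^{n} x_i^{w_i}\Bigr) \;\le\; \prod_{i=1}^{n} f(x_i)^{w_i},
  \qquad w_i \ge 0,\ \sum_{i=1}^{n} w_i = 1.
\]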

A General Theory of Almost Convex Functions

Let ∆_m = {(t_0, …, t_m) ∈ ℝ^{m+1} : t_i ≥ 0, ∑_{i=0}^{m} t_i = 1} be the standard m-dimensional simplex and let ∅ ≠ S ⊂ ⋃_{m=1}^{∞} ∆_m. Then a function h : C → ℝ, with domain a convex set C in a real vector space, is S-almost convex iff for all (t_0, …, t_m) ∈ S and x_0, …, x_m ∈ C the inequality h(t_0 x_0 + ⋯ + t_m x_m) ≤ 1 + t_0 h(x_0) + ⋯ + t_m h(x_m) holds. A detailed study of the properties of S-almost co...
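
For readability, the defining inequality from the snippet above can also be written in display form (a restatement of the same condition, not an additional hypothesis):

\[
  h\!\Bigl(\sum_{i=0}^{m} t_i x_i\Bigr) \;\le\; 1 + \sum_{i=0}^{m} t_i\, h(x_i),
  \qquad (t_0, \dots, t_m) \in S,\ \ x_0, \dots, x_m \in C.
\]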

A Convex Surrogate Operator for General Non-Modular Loss Functions

Empirical risk minimization frequently employs convex surrogates for underlying discrete loss functions in order to achieve computational tractability during optimization. However, classical convex surrogates can only tightly bound modular loss functions, submodular functions, or supermodular functions separately while maintaining polynomial-time computation. In this work, a novel generic convex ...

A New Type of Stable Generalized Convex Functions

S-quasiconvex functions (Phu and An, Optimization, Vol. 38, 1996) are stable with respect to the properties: “every lower level set is convex”, “each local minimizer is a global minimizer”, and “each stationary point is a global minimizer” (i.e., these properties remain true if a sufficiently small linear disturbance is added to a function of this class). In this paper, we introduce a subclass ...

Journal

Journal title: IEEE Control Systems Letters

Year: 2022

ISSN: 2475-1456

DOI: https://doi.org/10.1109/lcsys.2021.3082875